A Hierarchy of Equivalence Relations for Partially Observable Markov Decision Processes
Abstract
We discuss the problem of behavioural equivalence for partially observable systems with observations. We examine different types of equivalence relations on states, and show that branching equivalence relations are stronger than linear ones. Finally, we discuss how this hierarchy can be used in duality theory.
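The linear (trace-style) notions compared in this hierarchy equate states that induce the same probability distributions over observation sequences. A minimal sketch of such a check, assuming a toy single-action POMDP where each state has a distribution over (observation, next state) pairs; all names here (`trans`, `obs_seq_dist`, `trace_equivalent`) are hypothetical illustrations, not from the paper:

```python
# trans[s] = list of (probability, observation, next_state); single action assumed.
trans = {
    "s1": [(0.5, "a", "s3"), (0.5, "b", "s4")],
    "s2": [(0.5, "a", "s4"), (0.5, "b", "s3")],
    "s3": [(1.0, "a", "s3")],
    "s4": [(1.0, "b", "s4")],
}

def obs_seq_dist(state, k):
    """Distribution over observation sequences of length k starting from `state`."""
    # frontier maps an observation sequence to the weighted states reachable under it
    frontier = {(): [(1.0, state)]}
    for _ in range(k):
        new_frontier = {}
        for seq, weighted in frontier.items():
            for w, s in weighted:
                for p, o, t in trans[s]:
                    new_frontier.setdefault(seq + (o,), []).append((w * p, t))
        frontier = new_frontier
    # marginalise out the hidden state, keeping only observation-sequence mass
    return {seq: sum(w for w, _ in ws) for seq, ws in frontier.items()}

def trace_equivalent(s, t, k):
    """Linear (trace) equivalence up to horizon k."""
    return obs_seq_dist(s, k) == obs_seq_dist(t, k)
```

Here `s1` and `s2` agree on one-step observations (each emits `a` or `b` with probability 0.5) but are distinguished at horizon 2, since `s1` repeats its first observation while `s2` switches; a finite-horizon check like this is the weakest relation in the linear family, and branching relations such as bisimulation refine it further.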
Similar works
Equivalence Relations in Fully and Partially Observable Markov Decision Processes
We explore equivalence relations between states in Markov Decision Processes and Partially Observable Markov Decision Processes. We focus on two different equivalence notions: bisimulation [Givan et al., 2003] and a notion of trace equivalence, under which states are considered equivalent if they generate the same conditional probability distributions over observation sequences (where the condi...
Notions of State Equivalence under Partial Observability
We explore equivalence relations between states in Markov Decision Processes and Partially Observable Markov Decision Processes. We focus on two different equivalence notions: bisimulation (Givan et al., 2003) and a notion of trace equivalence, under which states are considered equivalent roughly if they generate the same conditional probability distributions over observation sequences (where th...
A POMDP Framework to Find Optimal Inspection and Maintenance Policies via Availability and Profit Maximization for Manufacturing Systems
Maintenance can either increase or decrease a system's availability, so it is valuable to evaluate a maintenance policy from both cost and availability perspectives simultaneously, according to the decision maker's priorities. This study proposes a Partially Observable Markov Decision Process (POMDP) framework for a partially observable and stochastically deteriorating syste...
Synthesis of Hierarchical Finite-State Controllers for POMDPs
We develop a hierarchical approach to planning for partially observable Markov decision processes (POMDPs) in which a policy is represented as a hierarchical finite-state controller. To provide a foundation for this approach, we discuss some extensions of the POMDP framework that allow us to formalize the process of abstraction by which a hierarchical controller is constructed. Then we describe...
Mobile Robot Control Using a Cloud of Particles
Common control systems for mobile robots combine deterministic control laws with state-estimation approaches under the certainty equivalence principle. Recent approaches instead use partially observable Markov decision process strategies together with Bayesian estimators. In order to reduce the required processing power and yet allow for multimodal...